Bootstrap Methods for Inference with Cluster Sample IV Models
Authors
Abstract
Similar articles
Cluster-robust Bootstrap Inference in Quantile Regression Models
In this paper I develop a wild bootstrap procedure for cluster-robust inference in linear quantile regression models. I show that the bootstrap leads to asymptotically valid inference on the entire quantile regression process in a setting with a large number of small, heterogeneous clusters and provides consistent estimates of the asymptotic covariance function of that process. The proposed boo...
Finite-sample bootstrap inference in GARCH models with heavy-tailed innovations
A general method is proposed for the construction of valid simultaneous confidence sets in the context of stationary GARCH models. The proposed method proceeds by numerically inverting the conventional likelihood ratio test. In order to hedge against the risk of a spurious rejection, candidate points that are rejected by the conventional test undergo a finite-sample parametric bootstrap test. A...
Wild Bootstrap Inference for Wildly Different Cluster Sizes
The cluster robust variance estimator (CRVE) relies on the number of clusters being large. The precise meaning of "large" is ambiguous, but a shorthand "rule of 42" has emerged. We show that this rule is invalid when clusters are not equal-sized. Monte Carlo evidence suggests that rejection frequencies can be much higher when a dataset has 50 clusters proportional to the populations of the US s...
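The CRVE and the wild cluster bootstrap discussed in this abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a hypothetical NumPy-only example, assuming an OLS model with Rademacher weights drawn once per cluster from the unrestricted residuals, and omitting the small-sample corrections a production implementation would apply:

```python
import numpy as np

def cluster_se(X, y, g, j=1):
    """OLS coefficient j and its cluster-robust (CRVE) standard error."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - X @ beta
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for c in np.unique(g):
        s = X[g == c].T @ u[g == c]  # per-cluster score vector
        meat += np.outer(s, s)
    V = bread @ meat @ bread  # sandwich estimator, no finite-sample correction
    return beta[j], np.sqrt(V[j, j])

def wild_cluster_p(X, y, g, j=1, B=399, seed=0):
    """Symmetric wild cluster bootstrap p-value for H0: beta_j = 0."""
    rng = np.random.default_rng(seed)
    b, se = cluster_se(X, y, g, j)
    t0 = b / se
    fit = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - fit
    clusters, idx = np.unique(g, return_inverse=True)
    exceed = 0
    for _ in range(B):
        # Rademacher weights drawn once per cluster, not per observation.
        w = rng.choice([-1.0, 1.0], size=clusters.size)
        y_star = fit + u * w[idx]
        b_star, se_star = cluster_se(X, y_star, g, j)
        # Center at the sample estimate, since y* uses the unrestricted fit.
        if abs((b_star - b) / se_star) >= abs(t0):
            exceed += 1
    return (exceed + 1) / (B + 1)

# Hypothetical demo: 10 clusters of 20, cluster-correlated noise, true slope 0.
rng = np.random.default_rng(1)
G = np.repeat(np.arange(10), 20)
x = rng.normal(size=200)
X = np.column_stack([np.ones(200), x])
y = rng.normal(size=10)[G] + rng.normal(size=200)
p = wild_cluster_p(X, y, G, B=99)
```

Drawing one weight per cluster (rather than per observation) is what preserves the within-cluster dependence of the residuals across bootstrap replications.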
The ROSAT Brightest Cluster Sample (BCS) — IV. The extended sample
We present a low-flux extension of the X-ray selected ROSAT Brightest Cluster Sample (BCS) published in Paper I of this series. Like the original BCS and employing an identical selection procedure, the BCS extension is compiled from ROSAT All-Sky Survey (RASS) data in the northern hemisphere (δ ≥ 0°) and at high Galactic latitudes (|b| ≥ 20°). It comprises 100 X-ray selected clusters of galax...
Parametric bootstrap methods for bias correction in linear mixed models
The empirical best linear unbiased predictor (EBLUP) in the linear mixed model (LMM) is useful for small area estimation, and estimation of the mean squared error (MSE) of the EBLUP is important as a measure of its uncertainty. To obtain a second-order unbiased estimator of the MSE, the second-order bias correction has been derived mainly based on Taylor series expansions. However, thi...
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
Journal
Journal title: SSRN Electronic Journal
Year: 2014
ISSN: 1556-5068
DOI: 10.2139/ssrn.2574521